How can the findings be reported and their use supported?
Make the outcomes of any evaluation or impact study more visible and clear so that they can be better used in future training.
Introduction:
The evaluation report should be structured in a manner that reflects the purpose and questions of the evaluation, and it should clearly address proposed changes and adaptations to future editions of the training.
The specific evaluative rubrics should be used to ‘interpret’ the evidence and determine which considerations are critically important or urgent. Evidence on multiple dimensions should subsequently be synthesized to generate answers to the high-level evaluative questions.
Content:
The structure of an evaluation report can do a great deal to encourage the succinct reporting of direct answers to evaluative questions, backed up by enough detail about the evaluative reasoning and methodology to allow the reader to follow the logic and clearly see the evidence base.
A facilitator should be able to design and plan the evaluation with future steps in mind, rather than focusing only on the immediate or short-term use of the evaluation data.
The following recommendations will help to set clear expectations for evaluation reports that are strong on evaluative reasoning:
- The executive summary must contain direct and explicitly evaluative answers to the questions used to guide the whole evaluation.
- Explicitly evaluative language must be used when presenting findings (rather than value-neutral language that merely describes findings). Examples should be provided.
- Clear and simple data visualizations should be used to present easy-to-understand ‘snapshots’ of how the intervention has performed on the various dimensions of merit.
- The findings section should be structured using questions as subheadings (rather than types and sources of evidence, as is frequently done).
- There must be clarity and transparency about the evaluative reasoning used, with explanations that are understandable both to non-evaluators and to readers without deep expertise in the subject matter. These explanations should be brief and high-level in the main body of the report, with more detail available in annexes.
- If the evaluative rubrics are relatively brief, they should be included in the main body of the report. If they are lengthy, a short summary of at least one or two should be included in the main body, with all rubrics provided in full in an annex.
Exercises:
How can this be applied in everyday work?
As you prepare your next evaluation, could you already start thinking about the final format of the report?
When preparing your next training, it is important to address the following points while planning the evaluation:
- How does the audience prefer to receive information – text, graphics, numbers, visuals, or a mixture of all of these?
- What is the preferred length (or duration if an audio/visual presentation)?
- What access does the audience have to information technology (this may inform whether you use web-based formats)?
- What is the purpose of the report and how does this inform the choice of format? Purposes may include:
  - keeping stakeholders engaged during an evaluation
  - providing feedback to and maintaining the commitment of people collecting data during implementation
  - flagging emerging findings and implications for ongoing program development and for the evaluation
  - presenting interim recommendations
  - seeking feedback on draft reports to assist in identifying causal factors
  - informing planning, funding or policy decisions
  - broader dissemination of findings to support their use
Reflection Questions:
- How often do I think in advance about the final use I will make of a training evaluation?
- Do I plan strategically, or only one activity ahead?
- Am I able to prepare a report that connects all of the above elements in one short and simple document?